Robot Talk Episode 142 – Collaborative robot arms, with Mark Gray
Mark Gray has worked in automation for the last 30 years, first in machine vision and robotics and later in collaborative robots, or cobots. As country manager, Mark was the first person to work for Universal Robots in the UK and has carried out projects with many research institutes, including the Advanced Manufacturing Research Centre (AMRC), the Manufacturing Technology Centre (MTC), the National Robotarium, and the Bristol Robotics Lab. Robot Talk is a weekly podcast that explores the exciting world of robotics, artificial intelligence and autonomous machines.
Robot Talk Episode 141 – Our relationship with robot swarms, with Razanne Abu-Aisheh
Claire chatted to Razanne Abu-Aisheh from the University of Bristol about how people feel about interacting with robot swarms. Razanne Abu-Aisheh is a Senior Research Associate in the Centre for Sociodigital Futures at the University of Bristol. Her work explores how people interact with robot swarms, with a focus on how collective robot behaviours influence human perception. In her current research, she collaborates with communities to imagine more inclusive and meaningful futures with robotics, working towards community-centred design. Her broader interests include bringing robot swarms into real-world settings and designing them with people in mind.
Vine-inspired robotic gripper gently lifts heavy and fragile objects
In the horticultural world, some vines are especially grabby. As they grow, the woody tendrils can wrap around obstacles with enough force to pull down entire fences and trees. Inspired by vines' twisty tenacity, engineers at MIT and Stanford University have developed a robotic gripper that can snake around and lift a variety of objects, including a glass vase and a watermelon, offering a gentler approach compared to conventional gripper designs. A larger version of the robo-tendrils can also safely lift a human out of bed. The new bot consists of a pressurized box, positioned near the target object, from which long, vine-like tubes inflate and grow, like socks being turned inside out.
Robot Talk Episode 140 – Robot balance and agility, with Amir Patel
Amir Patel is an Associate Professor of Robotics & AI in the Department of Computer Science at University College London (UCL). His research uses robotics methods (sensor fusion, computer vision, mechanical modelling, and optimal control) to understand and quantify animal locomotion, especially in high-speed predators such as the cheetah, and to translate these insights into bio-inspired machines. Previously, he served on the faculty of Electrical Engineering at the University of Cape Town, where he founded and directed the African Robotics Unit (ARU). Robot Talk is a weekly podcast that explores the exciting world of robotics, artificial intelligence and autonomous machines.
Robot Talk Episode 139 – Advanced robot hearing, with Christine Evers
Claire chatted to Christine Evers from the University of Southampton about helping robots understand the world around them through sound. Christine Evers is an Associate Professor in Computer Science and Director of the Centre for Robotics at the University of Southampton. Her research pushes the boundaries of machine listening, enabling robots to make sense of life in sound. Her current focus is embedding our understanding of the human auditory process into deep-learning audio architectures. This bio-inspired approach moves away from massive, internet-scale models toward compute-efficient and inherently interpretable systems, opening the door to a new generation of embodied auditory intelligence.
Meet the AI-powered robotic dog ready to help with emergency response
Developed by Texas A&M University engineering students, this AI-powered robotic dog doesn't just follow commands: designed to navigate chaos with precision, it could help revolutionize search-and-rescue missions, disaster response and many other emergency operations. Sandun Vitharana, an engineering technology master's student, and Sanjaya Mallikarachchi, an interdisciplinary engineering doctoral student, spearheaded the invention of the robotic dog. It can process voice commands and uses AI and camera input to perform path planning and identify objects. A roboticist would describe it as a terrestrial robot that uses a memory-driven navigation system powered by a multimodal large language model (MLLM).
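The article doesn't detail the team's system, but the pipeline it describes (a voice command resolved against remembered landmarks, then path planning toward the result) can be sketched in miniature. In this toy version the MLLM is replaced by a trivial lookup into a landmark memory, and planning is breadth-first search on a small occupancy grid; the grid, landmark names, and all function names are illustrative assumptions, not the actual implementation.

```python
# Conceptual sketch of memory-driven navigation (illustrative only).
# The "MLLM" is stubbed out as a keyword lookup into remembered landmarks;
# path planning is BFS on a tiny occupancy grid.
from collections import deque

GRID = [  # 0 = free cell, 1 = obstacle
    [0, 0, 0, 1, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 0, 0, 0],
]
MEMORY = {"door": (0, 4), "victim": (2, 0)}  # landmarks seen earlier

def resolve_goal(command):
    # Stand-in for the MLLM: find a remembered landmark named in the command.
    for name, pos in MEMORY.items():
        if name in command.lower():
            return pos
    raise ValueError("no known landmark in command")

def plan_path(start, goal):
    # Breadth-first search; returns the shortest list of cells start -> goal.
    rows, cols = len(GRID), len(GRID[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == goal:
            path, node = [], goal
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols \
                    and GRID[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None  # goal unreachable

path = plan_path((0, 0), resolve_goal("go to the door"))
print(path)  # shortest route around the wall at column 3
```

A real system would replace `resolve_goal` with the MLLM interpreting camera frames and speech, and BFS with a planner over the robot's mapped environment.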
A flexible lens controlled by light-activated artificial muscles promises to let soft machines see
Inspired by the human eye, our biomedical engineering lab at Georgia Tech has designed an adaptive lens made of soft, light-responsive, tissue-like materials. Adjustable camera systems usually require a set of bulky, moving, solid lenses and a pupil in front of a camera chip to adjust focus and intensity. In contrast, human eyes perform these same functions using soft, flexible tissues in a highly compact form. Our lens, called the photo-responsive hydrogel soft lens, or PHySL, replaces rigid components with soft polymers acting as artificial muscles. The polymers are composed of a hydrogel, a water-based polymer material.
Scaling 3D Reasoning with LMMs to Large Robot Mission Environments Using Datagraphs
Meijer, W. J., Kemmeren, A. C., Riemens, E. H. J., Fransman, J. E., van Bekkum, M., Burghouts, G. J., van Mil, J. D.
This paper addresses the challenge of scaling Large Multimodal Models (LMMs) to expansive 3D environments. Solving this open problem is especially relevant for robot deployment in many first-responder scenarios, such as search-and-rescue missions that cover vast spaces. The use of LMMs in these settings is currently hampered by the strict context windows that limit the LMM's input size. We therefore introduce a novel approach that utilizes a datagraph structure, which allows the LMM to iteratively query smaller sections of a large environment. Using the datagraph in conjunction with graph traversal algorithms, we can prioritize the locations most relevant to the query, thereby improving the scalability of 3D scene language tasks. We illustrate the datagraph using 3D scenes, but these can be easily substituted by other dense modalities that represent the environment, such as pointclouds or Gaussian splats. We demonstrate the potential to use the datagraph for two 3D scene language tasks in a search-and-rescue mission example.
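The core idea, a graph whose nodes hold context-window-sized chunks of the environment and a traversal that visits the most query-relevant chunks first, can be sketched independently of the paper's actual system. Everything here is a hedged assumption: the node contents are plain strings rather than 3D scene sections, the relevance scorer is simple word overlap instead of an LMM or embedding model, and the traversal is a generic best-first search, not the authors' algorithm.

```python
# Toy datagraph sketch (not the paper's implementation): nodes hold small
# environment chunks; best-first traversal expands the most query-relevant
# frontier node first, so a context-limited model only sees a few chunks.
import heapq

class DatagraphNode:
    def __init__(self, node_id, description, neighbors=None):
        self.node_id = node_id
        self.description = description  # stand-in for a 3D scene section
        self.neighbors = neighbors or []

def relevance(query, description):
    # Toy scorer: word overlap. A real system would use an LMM or embeddings.
    q, d = set(query.lower().split()), set(description.lower().split())
    return len(q & d)

def prioritized_traversal(graph, start_id, query, budget=3):
    """Visit up to `budget` nodes, always expanding the frontier node
    with the highest relevance to the query (best-first search)."""
    visited, results = set(), []
    frontier = [(-relevance(query, graph[start_id].description), start_id)]
    while frontier and len(results) < budget:
        neg_score, node_id = heapq.heappop(frontier)
        if node_id in visited:
            continue
        visited.add(node_id)
        results.append((node_id, -neg_score))
        for nb in graph[node_id].neighbors:
            if nb not in visited:
                score = relevance(query, graph[nb].description)
                heapq.heappush(frontier, (-score, nb))
    return results

# Tiny search-and-rescue environment: four connected rooms.
graph = {
    "lobby": DatagraphNode("lobby", "empty lobby with rubble", ["hall"]),
    "hall": DatagraphNode("hall", "hallway with debris",
                          ["lobby", "office", "storage"]),
    "office": DatagraphNode("office", "office with person trapped under desk",
                            ["hall"]),
    "storage": DatagraphNode("storage", "storage room with boxes", ["hall"]),
}
print(prioritized_traversal(graph, "lobby", "find trapped person", budget=3))
```

With a budget of three nodes, the traversal reaches the "office" chunk (the only one mentioning a trapped person) without ever loading the whole environment, which is the scalability point the abstract makes.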
Insect-Sized Light-Emitting Robots Can Now Make Rescue Operations Easier
Engineers at the Massachusetts Institute of Technology have developed robotic lightning bugs that emit light as they fly. Think fireflies, but with what the robots' creators call electroluminescent soft artificial muscles for flying. These tiny artificial muscles control the robots' wings and emit colored light during flight, giving researchers a low-cost way to track the machines and potentially enabling them to communicate. That light could one day make the robots useful for search-and-rescue missions: in dangerous locations, they could signal for help by flashing their lights.
MIT scientists create robotic FIREFLIES that could help search-and-rescue missions
Tiny robotic fireflies that weigh barely more than a paper clip and glow as they fly could be used to aid search-and-rescue missions, researchers claim. Engineers at MIT previously developed insect-sized robots with tiny artificial muscles that allow them to zip around with bug-like agility by rapidly flapping their wings. The engineers have now found a way to embed minuscule electroluminescent particles into these artificial muscles, meaning they emit coloured light during flight. The robots can use this light to communicate with each other, and could even use it to signal for help in emergency situations, according to the researchers. For example, if sent on a search-and-rescue mission into a collapsed building, a robot that finds survivors could use lights to signal others and call for help.